Translation: from all languages to all languages

CG method (conjugate-gradient method)

See also in other dictionaries:

  • Conjugate gradient method — A comparison of the convergence of gradient descent with optimal step size (in green) and conjugate vector (in red) for minimizing a quadratic function associated with a given linear system. Conjugate gradient, assuming exact arithmetic,… …   Wikipedia (a minimal Python sketch of the linear CG iteration follows after this list)

  • Nonlinear conjugate gradient method — In numerical optimization, the nonlinear conjugate gradient method generalizes the conjugate gradient method to nonlinear optimization. For a quadratic function f(x), the minimum of f is obtained when the gradient is zero: ∇f(x) = 0. Whereas linear conjugate… …   Wikipedia

  • Derivation of the conjugate gradient method — In numerical linear algebra, the conjugate gradient method is an iterative method for numerically solving the linear system Ax = b, where A is symmetric positive definite. The conjugate gradient method can be derived from several different perspectives,… …   Wikipedia

  • Preconditioned conjugate gradient method — The conjugate gradient method is a numerical algorithm that solves a system of linear equations Ax = b, where A is symmetric positive definite. If the matrix A is ill-conditioned, i.e. it has a large condition number κ(A), it is often… …   Wikipedia

  • Conjugate residual method — The conjugate residual method is an iterative numerical method used for solving systems of linear equations. It's a Krylov subspace method very similar to the much more popular conjugate gradient method, with similar construction and convergence… …   Wikipedia

  • Gradient descent — For the analytical method called steepest descent, see Method of steepest descent. Gradient descent is an optimization algorithm. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the… …   Wikipedia (a fixed-step sketch follows after this list)

  • Newton's method in optimization — A comparison of gradient descent (green) and Newton's method (red) for minimizing a function (with small step sizes). Newton's method uses curvature information to take a more direct route. In mathematics, Newton's method is an iterative method… …   Wikipedia

  • Iterative method — In computational mathematics, an iterative method is a mathematical procedure that generates a sequence of improving approximate solutions for a class of problems. A specific implementation of an iterative method, including the termination… …   Wikipedia

  • Biconjugate gradient method — In mathematics, more specifically in numerical analysis, the biconjugate gradient method is an algorithm to solve systems of linear equations Ax = b. Unlike the conjugate gradient method, this algorithm does not require the matrix A to be self… …   Wikipedia

  • Nelder–Mead method — Nelder–Mead simplex search over the Rosenbrock banana function (above) and Himmelblau's function (below). See simplex algorithm for Dantzig's algorithm for the problem of linear opti …   Wikipedia

  • Newton's method — In numerical analysis, Newton's method (also known as the Newton–Raphson method), named after Isaac Newton and Joseph Raphson, is a method for finding successively better approximations to the roots (or zeroes) of a real-valued function. The… …   Wikipedia
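
The conjugate gradient entry above refers to the following sketch: a minimal Python/NumPy illustration of the linear CG iteration for Ax = b with symmetric positive-definite A. It is not an implementation taken from any of the cited articles; the function name conjugate_gradient, its parameters, and the 2×2 example system are illustrative assumptions.

    import numpy as np

    def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
        # Solve A x = b for a symmetric positive-definite matrix A.
        n = b.shape[0]
        x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
        r = b - A @ x                      # residual of the current iterate
        p = r.copy()                       # first search direction is the residual
        rs_old = r @ r
        if max_iter is None:
            max_iter = n                   # exact arithmetic needs at most n steps
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)      # optimal step length along p
            x = x + alpha * p
            r = r - alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:      # residual small enough: stop
                break
            p = r + (rs_new / rs_old) * p  # next direction, A-conjugate to the previous ones
            rs_old = rs_new
        return x

    # Illustrative 2x2 symmetric positive-definite system.
    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(A, b))        # approximately [0.0909 0.6364]

In exact arithmetic the loop terminates after at most n steps, which is why max_iter defaults to the problem size; in floating point the residual tolerance does the stopping.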
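
For contrast, the gradient descent entry refers to this second sketch: fixed-step descent on the same kind of quadratic, whose gradient is Ax − b. The names gradient_descent and step, the step size 0.2, and the example system are likewise illustrative assumptions, not drawn from the source.

    import numpy as np

    def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10000):
        # Repeatedly step against the gradient until it is (nearly) zero.
        x = np.array(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            x = x - step * g
        return x

    # Illustrative quadratic f(x) = 0.5 x^T A x - b^T x; its gradient is A x - b.
    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(gradient_descent(lambda x: A @ x - b, x0=np.zeros(2), step=0.2))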
